Harry's ReadMe File and the Real Scam of ClimateGate

Steven Dutch, Natural and Applied Sciences, University of Wisconsin - Green Bay


A Note to Visitors

I will respond to questions and comments as time permits, but if you want to take issue with any position expressed here, you first have to answer this question:

What evidence would it take to prove your beliefs wrong?

I simply will not reply to challenges that do not address this question. Refutability is one of the classic determinants of whether a theory can be called scientific. Moreover, I have found it to be a great general-purpose cut-through-the-crap question to determine whether somebody is interested in serious intellectual inquiry or just playing mind games. Note, by the way, that I am assuming the burden of proof here - all you have to do is commit to a criterion for testing.

It's easy to criticize science for being "closed-minded". Are you open-minded enough to consider whether your ideas might be wrong?


One of the most infamous "smoking guns" in the ClimateGate scandal is a file called HARRY_READ_ME, a chronicle of efforts by a programmer to get data and programs related to global warming to work properly. But after reading the file, several crucial points leap out at me:

  1. Harry isn't trying to manipulate the data to reach a pre-ordained conclusion. He's trying to get the software to work, and to reproduce previously published results.
  2. Nobody is trying to manipulate the data to reinforce global warming. There's not a word in the file to indicate Harry is working toward generating a publishable result. That will come only after everything is working smoothly. And that doesn't happen.
  3. None of the snippets published in the web pages critical of global warming have anything remotely to do with what Harry is actually trying to accomplish, which means...
  4. All those pages out there trumpeting this as irrefutable proof of scientists colluding to delude the public are written by people too computer illiterate to understand the file, and in most cases they know nothing about it at all. They are merely parroting material pulled out by other people.

"I'm a Computer Expert and I...."

Point 4 above applies in spades to anyone who has something like the quote above on his page. You're a computer expert? Oooh, color me impressed. Which computer expert are you?

So if you have anything like the quote above on your page, you're a complete idiot and a high school kid writing BASIC could probably program better than you. In fact, if you write in to me I will forward your e-mail to your employer (or your customers if you're the boss) so they can see what kind of doofus is handling their IT.

Let The Games Begin

1. Two main filesystems relevant to the work:

/cru/dpe1a/f014
/cru/tyn1/f014

Both systems copied in their entirety to /cru/cruts/

Nearly 11,000 files! And about a dozen assorted 'read me' files addressing individual issues, the most useful being:

fromdpe1a/data/stnmon/doc/oldmethod/f90_READ_ME.txt
fromdpe1a/code/linux/cruts/_READ_ME.txt
fromdpe1a/code/idl/pro/README_GRIDDING.txt

(yes, they all have different name formats, and yes, one does begin '_'!)

Cue the ominous music. There are 11,000 files: some programs, some original data, and probably most of them derived files. Some past programmers and data creators documented their stuff, but the non-standard formats are a red flag that this may not end well.

2. After considerable searching, identified the latest database files for tmean:

fromdpe1a/data/cruts/database/+norm/tmp.0311051552.dtb
fromdpe1a/data/cruts/database/+norm/tmp.0311051552.dts

(yes.. that is a directory beginning with '+'!)

3. Successfully ran anomdtb.f90 to produce anomaly files (as per item 7 in the '_READ_ME.txt' file). Had to make some changes to allow for the move back to alphas (different field length from the 'wc -l' command).
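
The 'wc -l' remark is worth unpacking. The command prints a right-justified line count, and the width of that count field is not the same on every system, so a program that parses the output with a fixed-width read can silently mis-read the count when the code moves to the alphas. Harry doesn't show his code, so here is only a minimal Fortran 90 sketch of that kind of failure; the exact column widths and the sample strings are my own invention, not taken from anomdtb.f90.

program wc_field_demo
  implicit none
  ! Invented illustration: two plausible layouts of 'wc -l' output.
  ! The count is right-justified, but the field width differs by system.
  character(len=30) :: linux_line = '  46283 tmp.0311051552.dtb'
  character(len=30) :: alpha_line = '   46283 tmp.0311051552.dtb'
  integer :: n

  read(linux_line, '(i7)') n    ! a 7-column field matches this layout
  print *, 'parsed from first layout :', n    ! 46283

  read(alpha_line, '(i7)') n    ! same format now clips the last digit
  print *, 'parsed from second layout:', n    ! 4628, silently wrong
end program wc_field_demo

No error message, no crash - just a wrong number. That is exactly the sort of change "to allow for the move back to alphas" he's describing.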

4. Successfully ran the IDL regridding routine quick_interp_tdm.pro (why IDL?! Why not F90?!) to produce '.glo' files.

5. Currently trying to convert .glo files to .grim files so that we can compare with previous output. However the program suite headed by globulk.f90 is not playing nicely - problems with it expecting a defunct file system (all path widths were 80ch, have been globally changed to 160ch) and also no guidance on which reference files to choose. It also doesn't seem to like files being in any directory other than the current one!!
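
The 80-character versus 160-character path business is the classic fixed-length string problem in Fortran: assigning a long path to a CHARACTER variable that is too short raises no error, it just silently truncates. A minimal sketch of why widening those declarations matters; the deep directory names below are invented for illustration and are not the actual CRU tree beyond what's quoted above.

program path_width_demo
  implicit none
  character(len=80)  :: oldpath   ! the original assumed 80-character paths
  character(len=160) :: newpath   ! widened to 160 characters in the fix
  character(len=*), parameter :: deep = &
      '/cru/cruts/fromdpe1a/data/cruts/database/+norm/' // &
      'some/invented/deeply/nested/subdirectory/for/illustration/only/' // &
      'tmp.0311051552.dtb'

  oldpath = deep                  ! quietly cut off at 80 characters
  newpath = deep                  ! fits
  print *, 'full path length  :', len(deep)
  print *, 'len=80 variable   : ', trim(oldpath)
  print *, 'len=160 variable  : ', trim(newpath)
end program path_width_demo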

Houston, we have a problem. First of all, note that he wants to compare the results with previous output. He's validating the software, something several people have pompously assured me they do because they work in the real world. But what assurance does he have that files will necessarily have the same format just because they have the same extension? That's like me assuming that your .xls file on quarterly sales has the same format as my .xls file on earthquakes.
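
To make the point concrete, here is an invented Fortran 90 example (the file names and layouts are hypothetical, not Harry's): two files share an extension, but only one has the record layout the reader expects, and the same fixed-format read works on one and fails on the other.

program extension_demo
  implicit none
  integer :: year, ios
  real    :: val

  ! Write two files with the same '.dat' extension but different layouts.
  open(10, file='sales.dat', status='replace')
  write(10, '(i4,1x,f8.2)') 2009, 12345.67        ! year, quarterly total
  close(10)

  open(10, file='quakes.dat', status='replace')
  write(10, '(a)') 'M6.4 2009-11-13 depth 10 km'  ! free text, same extension
  close(10)

  ! The same fixed-format read applied to both files.
  open(10, file='sales.dat', status='old')
  read(10, '(i4,1x,f8.2)', iostat=ios) year, val
  print *, 'sales.dat : iostat =', ios, ' year =', year, ' value =', val
  close(10)

  open(10, file='quakes.dat', status='old')
  read(10, '(i4,1x,f8.2)', iostat=ios) year, val  ! same read, wrong layout
  print *, 'quakes.dat: iostat =', ios            ! nonzero: format mismatch
  close(10)
end program extension_demo

The extension tells you nothing; only the record layout does, and the only way to know the layout is documentation or trial and error - which is precisely what Harry is stuck doing.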

 



Created 18 July 2008; Last Update 24 May 2020

Not an official UW Green Bay site